Client Report - Can You Predict That?

Course DS 250

Author

Jada Bower

Show the code
!pip install tensorflow
import pandas as pd
import numpy as np
from lets_plot import *
import matplotlib.pyplot as plt
from xgboost import XGBClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, confusion_matrix, classification_report)
from sklearn.preprocessing import MinMaxScaler
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense

LetsPlot.setup_html(isolated_frame=True)

Elevator pitch

A SHORT (2-3 SENTENCES) PARAGRAPH THAT DESCRIBES KEY INSIGHTS TAKEN FROM METRICS IN THE PROJECT RESULTS THINK TOP OR MOST IMPORTANT RESULTS. (Note: this is not a summary of the project, but a summary of the results.)

I was able to create a machine learning model that used features such as whether the home is one story and the quality of the building materials to predict whether a home was built before or after 1980, reaching an F1 score of about 0.94. When I added the additional features from the expanded dataset, the F1 score rose to about 0.97, a substantial improvement.

QUESTION|TASK 1

Create 2-3 charts that evaluate potential relationships between the home variables and before1980. Explain what you learn from the charts that could help a machine learning algorithm.

First, the vast majority of one-story homes were built in 1980 or later. I did some research into why most one-story homes are of newer construction, and it seems to be because the Baby Boomer generation, which is proportionally massive, started to need more accessible homes in the 1980s and later. Since they were living mostly on their own or as couples, those homes did not need to be very large, so many more single-story homes were built to accommodate them.

Show the code
df = pd.read_csv("https://raw.githubusercontent.com/byuidatascience/data4dwellings/master/data-raw/dwellings_ml/dwellings_ml.csv")
Show the code
df_plotting = df.copy()
df_plotting['arcstyle_ONE-STORY'] = df_plotting['arcstyle_ONE-STORY'].astype('bool')
df_plotting['before1980'] = df_plotting['before1980'].astype('bool')

legend_labels = {
    'true': 'Before 1980',
    'false': 'Since 1980'
  }

(
  ggplot(df_plotting, aes(x="arcstyle_ONE-STORY",fill='before1980'))
  + geom_bar(stat='count', position='dodge')
  + scale_fill_manual(
      values = ['#e6272f', '#6d9ade'],
      name = "House Built:",
      labels = legend_labels
    )
  + labs(
      title="One Story Homes Before and After 1980",
      x="One Story",
      y='Number of Homes'
    )
  + theme(
      plot_title=element_text(size=20)
    )
)

Secondly, most homes without attached garages were built in 1980 or later. I honestly could not find an explanation for why this might be happening, but my guess is that attached garages are becoming less popular for aesthetic reasons.

Show the code
df_plotting['gartype_Att'] = df_plotting['gartype_Att'].astype('bool')

(
  ggplot(df_plotting, aes(x="gartype_Att",fill='before1980'))
  + geom_bar(stat='count', position='dodge')
  + scale_fill_manual(
      values = ['#e6272f', '#6d9ade'],
      name = "House Built:",
      labels = legend_labels
    )
  + labs(
      title="Homes with Attached Garages Before and After 1980",
      x="Attached Garage",
      y='Number of Homes'
    )
  + theme(
      plot_title=element_text(size=20)
    )
)

Thirdly, the data has a "Quality" scale from A-D plus X, which rates how high-quality the home is. The majority of the homes in the dataset fall in the "C" category, meaning decent homes but not the best. Most of the homes in this C category were built in or after 1980, which could suggest many more pre-built homes where the homeowner did not choose the materials: after 1980 it became much cheaper (hence the lower-quality building materials) and more popular to buy a home pre-built rather than build it yourself.

Show the code
df_plotting['quality_C'] = df_plotting['quality_C'].astype('bool')

(
  ggplot(df_plotting, aes(x="quality_C",fill='before1980'))
  + geom_bar(stat='count', position='dodge')
  + scale_fill_manual(
      values = ['#e6272f', '#6d9ade'],
      name = "House Built:",
      labels = legend_labels
    )
  + labs(
      title='Homes of Quality "C" Before and After 1980',
      x="C-Quality",
      y='Number of Homes'
    )
  + theme(
      plot_title=element_text(size=20)
    )
)

QUESTION|TASK 2

Build a classification model labeling houses as being built “before 1980” or “during or after 1980”. Your goal is to reach or exceed 90% accuracy. Explain your final model choice (algorithm, tuning parameters, etc) and describe what other models you tried.

I went with an XGBClassifier model for this project because I am in the Machine Learning class right now, and my teacher there says XGBoost is often his go-to model when he needs to get things done quickly and well. For parameters, I set objective='binary:hinge', which tells the model that the output should be either a 1 or a 0. The eval_metric tells the model which metric to track during evaluation; the 'error' option is calculated as #(wrong cases) / #(all cases). I didn't end up trying any other models because the XGBoost model exceeded 90% accuracy on the first try.

Show the code
features_in_order_of_importance = ['livearea','basement','netprice','numbaths','smonth','finbsmnt','numbdrm','tasp','deduct','abstrprd','nocars','gartype_Att','sprice','quality_C','status_I','quality_D','condition_AVG','arcstyle_ONE-STORY','arcstyle_MIDDLE UNIT','stories','syear','qualified_Q','gartype_Det','arcstyle_ONE AND HALF-STORY','arcstyle_END UNIT','arcstyle_TWO-STORY','condition_Good','quality_B','quality_A','totunits','condition_VGood','arcstyle_TRI-LEVEL','quality_X','gartype_det/CP','arcstyle_BI-LEVEL','arcstyle_THREE-STORY','condition_Excel','arcstyle_TRI-LEVEL WITH BASEMENT','arcstyle_CONVERSIONS','gartype_CP','arcstyle_TWO AND HALF-STORY','gartype_Att/Det','qualified_U','gartype_None','arcstyle_SPLIT LEVEL','condition_Fair','gartype_att/CP','status_V']

X = df[features_in_order_of_importance]
y = df['before1980']

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = XGBClassifier(
    objective='binary:hinge',  # output a hard 0/1 label rather than a probability
    eval_metric='error',       # error = #(wrong cases) / #(all cases)
    use_label_encoder=False    # deprecated; recent XGBoost versions ignore it
)

model.fit(X_train, y_train)
y_pred = model.predict(X_test)
Show the code
accuracy = accuracy_score(y_test, y_pred)
precision = precision_score(y_test, y_pred)
recall = recall_score(y_test, y_pred)
f1 = f1_score(y_test, y_pred)

print('Accuracy:', accuracy)
print('Precision:', precision)
print('Recall:', recall)
print('F1:', f1)
Accuracy: 0.9277765655684049
Precision: 0.9364697802197802
Recall: 0.9491820396797772
F1: 0.9427830596369923

QUESTION|TASK 3

Justify your classification model by discussing the most important features selected by your model. This discussion should include a feature importance chart and a description of the features.

I actually did this part before Task 1, which is why the three features I charted there are the three most important features from this model. As I already explained in Task 1, it makes sense that one-story construction would be the most important feature (because of the large aging population needing single-story homes), so see Task 1 for an explanation of the most important features.

Show the code
influences = (pd.DataFrame({'importance': model.feature_importances_,
                           'feature': X_train.columns})
                           .sort_values('importance')
                           .query('importance >= 0.02'))

(
ggplot(data = influences, mapping = aes(x = 'feature', y = 'importance'))
  + geom_bar(stat = 'identity') 
  + coord_flip()
  + labs(
    x = "Feature",
    y = "Importance",
    title = "XGBoost Classifier Feature Importance"
  )
  + theme(plot_title=element_text(size=15,hjust=-4.1))

)

QUESTION|TASK 4

Describe the quality of your classification model using 2-3 different evaluation metrics. You also need to explain how to interpret each of the evaluation metrics you use.

You can see above (when I ran my model) what the accuracy, precision, recall, and f1 scores are for my model. Here is a quick explanation of each of these metrics:

Accuracy - The ratio of correct predictions to total predictions. This is not always the best metric because it does not tell us what the model is doing well on versus where it is failing. For example, if the data were 90% one category and 10% another, the model could predict the first category every time and score 90% accuracy, which sounds decent, but the model would still not be very intelligent.
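That imbalanced-data pitfall is easy to demonstrate with a quick sketch (hypothetical labels, not the housing data):

```python
import numpy as np
from sklearn.metrics import accuracy_score

# Hypothetical imbalanced labels: 90% class 0, 10% class 1
y_true = np.array([0] * 90 + [1] * 10)

# A "model" that always predicts the majority class
y_pred = np.zeros(100, dtype=int)

# 90% accurate while learning nothing about class 1
print(accuracy_score(y_true, y_pred))  # 0.9
```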

Precision - Of all the predictions the model made for a certain category, what percent were correct? Having high precision means the model had few "false positives." But focusing only on the precision score can induce a model to be very conservative in its "positive" predictions: it might guess a category only when it is absolutely sure of it, so that precision stays very high.

Recall - Out of all the actual positives, how many did the model correctly identify? Focusing on recall has the opposite effect of focusing on precision: where precision makes the model hesitant to guess "positive," recall makes it more likely to, because if it guesses "positive" for everything, recall is 100%.

F1 Score - F1 is the harmonic mean of precision and recall, which forces the model to balance both rather than favoring one over the other. A high F1 score means the model performed well on both precision and recall, which usually makes it the best single summary of how the model performs in general.

In this case, recall might be the most useful metric to focus on, because we would rather find all the houses with asbestos, even if we also flag some that weren't actually built before 1980. Better to be safe and check every possible house than check only a few and let people go on living in homes with asbestos.
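To make the definitions above concrete, here is a small sketch on a made-up confusion matrix (8 true positives, 2 false negatives, 1 false positive, 9 true negatives), verifying that F1 is the harmonic mean of precision and recall:

```python
from sklearn.metrics import precision_score, recall_score, f1_score

# Hypothetical toy labels: 8 TP, 2 FN, 1 FP, 9 TN
y_true = [1] * 10 + [0] * 10
y_pred = [1] * 8 + [0] * 2 + [1] * 1 + [0] * 9

precision = precision_score(y_true, y_pred)  # TP / (TP + FP) = 8 / 9
recall = recall_score(y_true, y_pred)        # TP / (TP + FN) = 8 / 10
f1 = f1_score(y_true, y_pred)

# F1 equals the harmonic mean of precision and recall
print(precision, recall, f1)
print(2 * precision * recall / (precision + recall))
```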


STRETCH QUESTION|TASK 1

Repeat the classification model using 3 different algorithms. Display their Feature Importance, and Decision Matrix. Explain the differences between the models and which one you would recommend to the Client.

I decided to run a decision tree, a random forest, and a neural network for my other three models. The decision tree is one of the simplest models: it repeatedly picks a feature and splits the data on it until it has a decent prediction. The random forest runs many smaller trees and takes the most common prediction among them. The neural network is more complicated, but essentially it takes in all the features as numbers, multiplies them by weights that start out random, and measures how far the output is from the actual value; it then adjusts the weights slightly to reduce that loss, gradually getting better results. Based on these results I would recommend the random forest, because it performed best on nearly all the evaluation metrics.

Show the code
# DECISION TREE
from sklearn.tree import DecisionTreeClassifier

tree = DecisionTreeClassifier()
tree.fit(X_train, y_train)
tree_pred = tree.predict(X_test)

tree_accuracy = accuracy_score(y_test, tree_pred)
tree_precision = precision_score(y_test, tree_pred)
tree_recall = recall_score(y_test, tree_pred)
tree_f1 = f1_score(y_test, tree_pred)

print('Decision Tree')
print('Accuracy:', tree_accuracy)
print('Precision:', tree_precision)
print('Recall:', tree_recall)
print('F1:', tree_f1)
print('Confusion Matrix:')
print(confusion_matrix(y_test, tree_pred))
Decision Tree
Accuracy: 0.9007200523674449
Precision: 0.92902767920511
Recall: 0.9112426035502958
F1: 0.9200492004920049
Confusion Matrix:
[[1510  200]
 [ 255 2618]]
Show the code
influences = (pd.DataFrame({'importance': tree.feature_importances_,
                           'feature': X_train.columns})
                           .sort_values('importance')
                           .query('importance >= 0.02'))

(
ggplot(data = influences, mapping = aes(x = 'feature', y = 'importance'))
  + geom_bar(stat = 'identity') 
  + coord_flip()
  + labs(
    x = "Feature",
    y = "Importance",
    title = "Decision Tree Classifier Feature Importance"
  )
  + theme(plot_title=element_text(size=15,hjust=-1.3))
)
Show the code
# RANDOM FOREST
from sklearn.ensemble import RandomForestClassifier

forest = RandomForestClassifier()
forest.fit(X_train, y_train)
forest_pred = forest.predict(X_test)

forest_accuracy = accuracy_score(y_test, forest_pred)
forest_precision = precision_score(y_test, forest_pred)
forest_recall = recall_score(y_test, forest_pred)
forest_f1 = f1_score(y_test, forest_pred)

print('Random Forest')
print('Accuracy:', forest_accuracy)
print('Precision:', forest_precision)
print('Recall:', forest_recall)
print('F1:', forest_f1)
print('Confusion Matrix:')
print(confusion_matrix(y_test, forest_pred))
Random Forest
Accuracy: 0.9314859262491818
Precision: 0.9428868120456906
Recall: 0.9481378350156631
F1: 0.9455050329746616
Confusion Matrix:
[[1545  165]
 [ 149 2724]]
Show the code
influences = (pd.DataFrame({'importance': forest.feature_importances_,
                           'feature': X_train.columns})
                           .sort_values('importance')
                           .query('importance >= 0.02'))

(
ggplot(data = influences, mapping = aes(x = 'feature', y = 'importance'))
  + geom_bar(stat = 'identity') 
  + coord_flip()
  + labs(
    x = "Feature",
    y = "Importance",
    title = "Random Forest Classifier Feature Importance"
  )
  + theme(plot_title=element_text(size=15,hjust=-1.45))
)
Show the code
# NEURAL NETWORK
from tensorflow import keras
from tensorflow.keras import Input

# Scale features to [0, 1] so the network trains more stably
norm = MinMaxScaler().fit(X_train)
X_train = norm.transform(X_train)
X_test = norm.transform(X_test)

model = Sequential()
model.add(Input(shape=(len(X_train[0]),)))
model.add(Dense(16, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

opt = keras.optimizers.Adam()
model.compile(loss='binary_crossentropy', optimizer=opt, metrics=['accuracy'])

early_stop = keras.callbacks.EarlyStopping(monitor='val_loss', patience=10)

history = model.fit(X_train, y_train, epochs=2000, validation_split=.2, batch_size=32, callbacks=[early_stop],shuffle=False)

hist = pd.DataFrame(history.history)

hist = hist.reset_index()

predictions = model.predict(X_test)
binary_predictions = (predictions >= 0.5).astype(int)

nn_accuracy = accuracy_score(y_test, binary_predictions)
nn_precision = precision_score(y_test, binary_predictions)
nn_recall = recall_score(y_test, binary_predictions)
nn_f1 = f1_score(y_test, binary_predictions)

print('Neural Network')
print('Accuracy:', nn_accuracy)
print('Precision:', nn_precision)
print('Recall:', nn_recall)
print('F1:', nn_f1)
print('Confusion Matrix:')
print(confusion_matrix(y_test, binary_predictions))
Epoch 1/2000  - accuracy: 0.7599 - loss: 0.4862 - val_accuracy: 0.8655 - val_loss: 0.3322
Epoch 2/2000  - accuracy: 0.8685 - loss: 0.3138 - val_accuracy: 0.8696 - val_loss: 0.3175
Epoch 3/2000  - accuracy: 0.8712 - loss: 0.3024 - val_accuracy: 0.8726 - val_loss: 0.3111
Epoch 4/2000  - accuracy: 0.8758 - loss: 0.2948 - val_accuracy: 0.8773 - val_loss: 0.3066
Epoch 5/2000  - accuracy: 0.8781 - loss: 0.2895 - val_accuracy: 0.8800 - val_loss: 0.3029
Epoch 6/2000  - accuracy: 0.8789 - loss: 0.2855 - val_accuracy: 0.8819 - val_loss: 0.2999
Epoch 7/2000  - accuracy: 0.8791 - loss: 0.2820 - val_accuracy: 0.8816 - val_loss: 0.2977
Epoch 8/2000  - accuracy: 0.8804 - loss: 0.2793 - val_accuracy: 0.8830 - val_loss: 0.2961
Epoch 9/2000  - accuracy: 0.8817 - loss: 0.2767 - val_accuracy: 0.8830 - val_loss: 0.2944
Epoch 10/2000 - accuracy: 0.8825 - loss: 0.2742 - val_accuracy: 0.8838 - val_loss: 0.2926
Epoch 11/2000 - accuracy: 0.8829 - loss: 0.2723 - val_accuracy: 0.8835 - val_loss: 0.2911
Epoch 12/2000 - accuracy: 0.8837 - loss: 0.2703 - val_accuracy: 0.8830 - val_loss: 0.2892
Epoch 13/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 13s 30ms/step - accuracy: 0.9688 - loss: 0.0901 48/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8930 - loss: 0.2514   94/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8895 - loss: 0.2574148/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8882 - loss: 0.2606197/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8875 - loss: 0.2625245/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8867 - loss: 0.2648298/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8863 - loss: 0.2666355/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8858 - loss: 0.2677410/459 ━━━━━━━━━━━━━━━━━━━━ 0s 993us/step - accuracy: 0.8856 - loss: 0.2683459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8855 - loss: 0.2685 - val_accuracy: 0.8835 - val_loss: 0.2884
Epoch 14/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 10s 23ms/step - accuracy: 1.0000 - loss: 0.0882 48/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8978 - loss: 0.2494  100/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8921 - loss: 0.2560153/459 ━━━━━━━━━━━━━━━━━━━━ 0s 992us/step - accuracy: 0.8901 - loss: 0.2592206/459 ━━━━━━━━━━━━━━━━━━━━ 0s 984us/step - accuracy: 0.8889 - loss: 0.2613252/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8880 - loss: 0.2635  306/459 ━━━━━━━━━━━━━━━━━━━━ 0s 993us/step - accuracy: 0.8873 - loss: 0.2651357/459 ━━━━━━━━━━━━━━━━━━━━ 0s 991us/step - accuracy: 0.8868 - loss: 0.2661410/459 ━━━━━━━━━━━━━━━━━━━━ 0s 987us/step - accuracy: 0.8864 - loss: 0.2667459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8863 - loss: 0.2669 - val_accuracy: 0.8838 - val_loss: 0.2870
Epoch 15/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 10s 23ms/step - accuracy: 1.0000 - loss: 0.0872 47/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8955 - loss: 0.2468   97/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8902 - loss: 0.2539141/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8888 - loss: 0.2567189/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8882 - loss: 0.2587232/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8875 - loss: 0.2609283/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8871 - loss: 0.2627332/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8867 - loss: 0.2639383/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8865 - loss: 0.2647434/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8863 - loss: 0.2651459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8863 - loss: 0.2652 - val_accuracy: 0.8830 - val_loss: 0.2860
Epoch 16/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 10s 23ms/step - accuracy: 1.0000 - loss: 0.0867 49/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8960 - loss: 0.2448  101/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8912 - loss: 0.2522153/459 ━━━━━━━━━━━━━━━━━━━━ 0s 997us/step - accuracy: 0.8900 - loss: 0.2555202/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8894 - loss: 0.2575  249/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8887 - loss: 0.2598299/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8883 - loss: 0.2615351/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8879 - loss: 0.2625400/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8877 - loss: 0.2632451/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8876 - loss: 0.2635459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8876 - loss: 0.2635 - val_accuracy: 0.8822 - val_loss: 0.2853
Epoch 17/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 10s 22ms/step - accuracy: 1.0000 - loss: 0.0872 50/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8944 - loss: 0.2431  101/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8903 - loss: 0.2506154/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8895 - loss: 0.2539197/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8892 - loss: 0.2556247/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8885 - loss: 0.2581293/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8881 - loss: 0.2597340/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8878 - loss: 0.2608394/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8875 - loss: 0.2616448/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8874 - loss: 0.2620459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8874 - loss: 0.2620 - val_accuracy: 0.8833 - val_loss: 0.2844
Epoch 18/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 10s 24ms/step - accuracy: 1.0000 - loss: 0.0857 51/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8965 - loss: 0.2414  100/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8928 - loss: 0.2489153/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8918 - loss: 0.2523203/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8913 - loss: 0.2543251/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8906 - loss: 0.2568295/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8902 - loss: 0.2583334/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8899 - loss: 0.2592370/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8896 - loss: 0.2598409/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8894 - loss: 0.2603450/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8893 - loss: 0.2606459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 2ms/step - accuracy: 0.8893 - loss: 0.2606 - val_accuracy: 0.8830 - val_loss: 0.2835
Epoch 19/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 11s 26ms/step - accuracy: 1.0000 - loss: 0.0859 48/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8984 - loss: 0.2392   91/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8944 - loss: 0.2467135/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8932 - loss: 0.2499178/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8928 - loss: 0.2518224/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8922 - loss: 0.2540269/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8916 - loss: 0.2560310/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8912 - loss: 0.2572350/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8909 - loss: 0.2580389/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8906 - loss: 0.2586432/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8904 - loss: 0.2590459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 2ms/step - accuracy: 0.8904 - loss: 0.2591 - val_accuracy: 0.8838 - val_loss: 0.2831
Epoch 20/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 12s 28ms/step - accuracy: 1.0000 - loss: 0.0855 45/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8977 - loss: 0.2365   89/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8942 - loss: 0.2444130/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8930 - loss: 0.2476172/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8927 - loss: 0.2497214/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8922 - loss: 0.2517257/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8915 - loss: 0.2538295/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8911 - loss: 0.2551335/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8908 - loss: 0.2561377/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8905 - loss: 0.2568418/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8903 - loss: 0.2573457/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8903 - loss: 0.2575459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 2ms/step - accuracy: 0.8903 - loss: 0.2575 - val_accuracy: 0.8841 - val_loss: 0.2821
Epoch 21/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 12s 28ms/step - accuracy: 1.0000 - loss: 0.0858 43/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8966 - loss: 0.2346   88/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8934 - loss: 0.2426132/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8924 - loss: 0.2462169/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8923 - loss: 0.2481210/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8918 - loss: 0.2500248/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8913 - loss: 0.2520285/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8910 - loss: 0.2534320/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8908 - loss: 0.2544359/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8906 - loss: 0.2551399/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8905 - loss: 0.2557440/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8904 - loss: 0.2561459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 2ms/step - accuracy: 0.8905 - loss: 0.2562 - val_accuracy: 0.8849 - val_loss: 0.2813
Epoch 22/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 13s 30ms/step - accuracy: 1.0000 - loss: 0.0860 43/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8974 - loss: 0.2326   86/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8942 - loss: 0.2406129/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8931 - loss: 0.2445181/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8928 - loss: 0.2472239/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8922 - loss: 0.2502291/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8917 - loss: 0.2523342/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8914 - loss: 0.2535396/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8912 - loss: 0.2544450/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8912 - loss: 0.2549459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8912 - loss: 0.2550 - val_accuracy: 0.8865 - val_loss: 0.2804
Epoch 23/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 10s 23ms/step - accuracy: 1.0000 - loss: 0.0861 45/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8975 - loss: 0.2309   91/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8944 - loss: 0.2400145/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8940 - loss: 0.2443197/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8939 - loss: 0.2467249/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8934 - loss: 0.2495299/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8930 - loss: 0.2514352/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8926 - loss: 0.2526405/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8924 - loss: 0.2535459/459 ━━━━━━━━━━━━━━━━━━━━ 0s 997us/step - accuracy: 0.8923 - loss: 0.2539459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8923 - loss: 0.2539 - val_accuracy: 0.8871 - val_loss: 0.2796
Epoch 24/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 10s 24ms/step - accuracy: 1.0000 - loss: 0.0859 49/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8959 - loss: 0.2303   99/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8935 - loss: 0.2398154/459 ━━━━━━━━━━━━━━━━━━━━ 0s 989us/step - accuracy: 0.8933 - loss: 0.2438205/459 ━━━━━━━━━━━━━━━━━━━━ 0s 987us/step - accuracy: 0.8934 - loss: 0.2462257/459 ━━━━━━━━━━━━━━━━━━━━ 0s 986us/step - accuracy: 0.8929 - loss: 0.2489310/459 ━━━━━━━━━━━━━━━━━━━━ 0s 982us/step - accuracy: 0.8926 - loss: 0.2508363/459 ━━━━━━━━━━━━━━━━━━━━ 0s 976us/step - accuracy: 0.8924 - loss: 0.2519409/459 ━━━━━━━━━━━━━━━━━━━━ 0s 991us/step - accuracy: 0.8922 - loss: 0.2526450/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8922 - loss: 0.2530  459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8922 - loss: 0.2530 - val_accuracy: 0.8863 - val_loss: 0.2789
Epoch 25/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 11s 24ms/step - accuracy: 1.0000 - loss: 0.0857 51/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8974 - loss: 0.2292  101/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8943 - loss: 0.2387156/459 ━━━━━━━━━━━━━━━━━━━━ 0s 979us/step - accuracy: 0.8941 - loss: 0.2428210/459 ━━━━━━━━━━━━━━━━━━━━ 0s 970us/step - accuracy: 0.8940 - loss: 0.2454261/459 ━━━━━━━━━━━━━━━━━━━━ 0s 972us/step - accuracy: 0.8935 - loss: 0.2481312/459 ━━━━━━━━━━━━━━━━━━━━ 0s 975us/step - accuracy: 0.8932 - loss: 0.2498356/459 ━━━━━━━━━━━━━━━━━━━━ 0s 995us/step - accuracy: 0.8929 - loss: 0.2508407/459 ━━━━━━━━━━━━━━━━━━━━ 0s 995us/step - accuracy: 0.8927 - loss: 0.2516451/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8927 - loss: 0.2520  459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8927 - loss: 0.2520 - val_accuracy: 0.8882 - val_loss: 0.2780
Epoch 26/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 10s 23ms/step - accuracy: 1.0000 - loss: 0.0845 51/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8957 - loss: 0.2280  103/459 ━━━━━━━━━━━━━━━━━━━━ 0s 994us/step - accuracy: 0.8934 - loss: 0.2378143/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8933 - loss: 0.2410  187/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8934 - loss: 0.2431237/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8930 - loss: 0.2459289/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8928 - loss: 0.2481341/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8925 - loss: 0.2495390/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8923 - loss: 0.2503436/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8923 - loss: 0.2509459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8923 - loss: 0.2510 - val_accuracy: 0.8884 - val_loss: 0.2774
Epoch 27/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 11s 24ms/step - accuracy: 1.0000 - loss: 0.0836 47/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8977 - loss: 0.2259   99/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8947 - loss: 0.2364143/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8944 - loss: 0.2400197/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8944 - loss: 0.2426248/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8940 - loss: 0.2455300/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8936 - loss: 0.2475350/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8933 - loss: 0.2488400/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8931 - loss: 0.2496447/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8930 - loss: 0.2501459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8930 - loss: 0.2502 - val_accuracy: 0.8879 - val_loss: 0.2769
Epoch 28/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 12s 27ms/step - accuracy: 1.0000 - loss: 0.0833 41/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8982 - loss: 0.2244   84/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8959 - loss: 0.2334133/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8954 - loss: 0.2385183/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8955 - loss: 0.2412237/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8952 - loss: 0.2442290/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8948 - loss: 0.2464341/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8945 - loss: 0.2478386/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8943 - loss: 0.2486440/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8941 - loss: 0.2493459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8941 - loss: 0.2494 - val_accuracy: 0.8890 - val_loss: 0.2765
Epoch 29/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 10s 22ms/step - accuracy: 1.0000 - loss: 0.0829 52/459 ━━━━━━━━━━━━━━━━━━━━ 0s 983us/step - accuracy: 0.8978 - loss: 0.2250107/459 ━━━━━━━━━━━━━━━━━━━━ 0s 954us/step - accuracy: 0.8959 - loss: 0.2353160/459 ━━━━━━━━━━━━━━━━━━━━ 0s 952us/step - accuracy: 0.8959 - loss: 0.2393210/459 ━━━━━━━━━━━━━━━━━━━━ 0s 963us/step - accuracy: 0.8959 - loss: 0.2418261/459 ━━━━━━━━━━━━━━━━━━━━ 0s 970us/step - accuracy: 0.8954 - loss: 0.2445315/459 ━━━━━━━━━━━━━━━━━━━━ 0s 963us/step - accuracy: 0.8951 - loss: 0.2464369/459 ━━━━━━━━━━━━━━━━━━━━ 0s 960us/step - accuracy: 0.8948 - loss: 0.2476425/459 ━━━━━━━━━━━━━━━━━━━━ 0s 951us/step - accuracy: 0.8946 - loss: 0.2484459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8946 - loss: 0.2487 - val_accuracy: 0.8887 - val_loss: 0.2759
Epoch 30/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 10s 24ms/step - accuracy: 1.0000 - loss: 0.0826 51/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8977 - loss: 0.2235  103/459 ━━━━━━━━━━━━━━━━━━━━ 0s 997us/step - accuracy: 0.8959 - loss: 0.2339158/459 ━━━━━━━━━━━━━━━━━━━━ 0s 973us/step - accuracy: 0.8960 - loss: 0.2382213/459 ━━━━━━━━━━━━━━━━━━━━ 0s 956us/step - accuracy: 0.8960 - loss: 0.2410269/459 ━━━━━━━━━━━━━━━━━━━━ 0s 947us/step - accuracy: 0.8955 - loss: 0.2440322/459 ━━━━━━━━━━━━━━━━━━━━ 0s 947us/step - accuracy: 0.8952 - loss: 0.2457370/459 ━━━━━━━━━━━━━━━━━━━━ 0s 960us/step - accuracy: 0.8949 - loss: 0.2468425/459 ━━━━━━━━━━━━━━━━━━━━ 0s 955us/step - accuracy: 0.8947 - loss: 0.2476459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8947 - loss: 0.2479 - val_accuracy: 0.8893 - val_loss: 0.2752
Epoch 31/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 11s 24ms/step - accuracy: 1.0000 - loss: 0.0816 41/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8980 - loss: 0.2210   85/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8963 - loss: 0.2307130/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8962 - loss: 0.2355180/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8964 - loss: 0.2385228/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8962 - loss: 0.2413276/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8959 - loss: 0.2436323/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8956 - loss: 0.2451370/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8954 - loss: 0.2461423/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8952 - loss: 0.2470459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8952 - loss: 0.2472 - val_accuracy: 0.8884 - val_loss: 0.2747
Epoch 32/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 10s 22ms/step - accuracy: 1.0000 - loss: 0.0811 44/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8980 - loss: 0.2205   87/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8961 - loss: 0.2304131/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8959 - loss: 0.2349176/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8962 - loss: 0.2376223/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8961 - loss: 0.2403272/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8957 - loss: 0.2428319/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8955 - loss: 0.2444370/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8952 - loss: 0.2456421/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8950 - loss: 0.2464459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8950 - loss: 0.2466 - val_accuracy: 0.8898 - val_loss: 0.2741
Epoch 33/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 10s 24ms/step - accuracy: 1.0000 - loss: 0.0814 46/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8980 - loss: 0.2202  102/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8958 - loss: 0.2319154/459 ━━━━━━━━━━━━━━━━━━━━ 0s 991us/step - accuracy: 0.8960 - loss: 0.2361198/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8961 - loss: 0.2384  241/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8959 - loss: 0.2410286/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8957 - loss: 0.2429329/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8954 - loss: 0.2443377/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8953 - loss: 0.2453427/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8951 - loss: 0.2460459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8951 - loss: 0.2462 - val_accuracy: 0.8893 - val_loss: 0.2739
Epoch 34/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 11s 26ms/step - accuracy: 1.0000 - loss: 0.0805 49/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8983 - loss: 0.2198   93/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8963 - loss: 0.2300144/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8964 - loss: 0.2347195/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8966 - loss: 0.2374245/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8963 - loss: 0.2404292/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8960 - loss: 0.2424346/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8957 - loss: 0.2439396/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8954 - loss: 0.2449437/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8953 - loss: 0.2454459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8953 - loss: 0.2455 - val_accuracy: 0.8893 - val_loss: 0.2734
Epoch 35/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 10s 24ms/step - accuracy: 1.0000 - loss: 0.0800 45/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8997 - loss: 0.2183   99/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8973 - loss: 0.2299149/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8972 - loss: 0.2342200/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8972 - loss: 0.2369255/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8968 - loss: 0.2401308/459 ━━━━━━━━━━━━━━━━━━━━ 0s 993us/step - accuracy: 0.8964 - loss: 0.2422355/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8961 - loss: 0.2434  406/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8958 - loss: 0.2443459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8957 - loss: 0.2448 - val_accuracy: 0.8884 - val_loss: 0.2736
Epoch 36/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 10s 22ms/step - accuracy: 1.0000 - loss: 0.0800 47/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9004 - loss: 0.2180  100/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8979 - loss: 0.2295155/459 ━━━━━━━━━━━━━━━━━━━━ 0s 986us/step - accuracy: 0.8977 - loss: 0.2341209/459 ━━━━━━━━━━━━━━━━━━━━ 0s 975us/step - accuracy: 0.8976 - loss: 0.2370260/459 ━━━━━━━━━━━━━━━━━━━━ 0s 980us/step - accuracy: 0.8970 - loss: 0.2399307/459 ━━━━━━━━━━━━━━━━━━━━ 0s 993us/step - accuracy: 0.8966 - loss: 0.2417361/459 ━━━━━━━━━━━━━━━━━━━━ 0s 984us/step - accuracy: 0.8962 - loss: 0.2431418/459 ━━━━━━━━━━━━━━━━━━━━ 0s 972us/step - accuracy: 0.8959 - loss: 0.2440459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8958 - loss: 0.2444 - val_accuracy: 0.8895 - val_loss: 0.2730
Epoch 37/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 11s 24ms/step - accuracy: 1.0000 - loss: 0.0797 51/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8993 - loss: 0.2185  101/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8974 - loss: 0.2289153/459 ━━━━━━━━━━━━━━━━━━━━ 0s 990us/step - accuracy: 0.8975 - loss: 0.2333196/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8975 - loss: 0.2355  247/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8970 - loss: 0.2386299/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8964 - loss: 0.2408349/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8961 - loss: 0.2422399/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8958 - loss: 0.2431449/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8956 - loss: 0.2437459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8956 - loss: 0.2437 - val_accuracy: 0.8887 - val_loss: 0.2731
Epoch 38/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 10s 24ms/step - accuracy: 1.0000 - loss: 0.0797 45/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8994 - loss: 0.2165   94/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8974 - loss: 0.2275139/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8973 - loss: 0.2317187/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8974 - loss: 0.2344241/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8969 - loss: 0.2377297/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8963 - loss: 0.2401349/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8959 - loss: 0.2416403/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8957 - loss: 0.2426459/459 ━━━━━━━━━━━━━━━━━━━━ 0s 999us/step - accuracy: 0.8955 - loss: 0.2432459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8955 - loss: 0.2432 - val_accuracy: 0.8890 - val_loss: 0.2727
Epoch 39/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 11s 25ms/step - accuracy: 1.0000 - loss: 0.0802 52/459 ━━━━━━━━━━━━━━━━━━━━ 0s 998us/step - accuracy: 0.8982 - loss: 0.2177106/459 ━━━━━━━━━━━━━━━━━━━━ 0s 969us/step - accuracy: 0.8965 - loss: 0.2282161/459 ━━━━━━━━━━━━━━━━━━━━ 0s 950us/step - accuracy: 0.8969 - loss: 0.2326216/459 ━━━━━━━━━━━━━━━━━━━━ 0s 944us/step - accuracy: 0.8969 - loss: 0.2357269/459 ━━━━━━━━━━━━━━━━━━━━ 0s 948us/step - accuracy: 0.8964 - loss: 0.2385323/459 ━━━━━━━━━━━━━━━━━━━━ 0s 947us/step - accuracy: 0.8960 - loss: 0.2404374/459 ━━━━━━━━━━━━━━━━━━━━ 0s 954us/step - accuracy: 0.8958 - loss: 0.2416425/459 ━━━━━━━━━━━━━━━━━━━━ 0s 958us/step - accuracy: 0.8955 - loss: 0.2424459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8955 - loss: 0.2426 - val_accuracy: 0.8901 - val_loss: 0.2730
Epoch 40/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 12s 28ms/step - accuracy: 1.0000 - loss: 0.0801 50/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8970 - loss: 0.2168  103/459 ━━━━━━━━━━━━━━━━━━━━ 0s 987us/step - accuracy: 0.8962 - loss: 0.2274154/459 ━━━━━━━━━━━━━━━━━━━━ 0s 991us/step - accuracy: 0.8968 - loss: 0.2317206/459 ━━━━━━━━━━━━━━━━━━━━ 0s 988us/step - accuracy: 0.8970 - loss: 0.2344257/459 ━━━━━━━━━━━━━━━━━━━━ 0s 990us/step - accuracy: 0.8966 - loss: 0.2374306/459 ━━━━━━━━━━━━━━━━━━━━ 0s 995us/step - accuracy: 0.8963 - loss: 0.2393353/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8960 - loss: 0.2406  404/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8958 - loss: 0.2415455/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8956 - loss: 0.2421459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8956 - loss: 0.2421 - val_accuracy: 0.8898 - val_loss: 0.2722
Epoch 41/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 10s 23ms/step - accuracy: 1.0000 - loss: 0.0800 52/459 ━━━━━━━━━━━━━━━━━━━━ 0s 999us/step - accuracy: 0.8979 - loss: 0.2168105/459 ━━━━━━━━━━━━━━━━━━━━ 0s 972us/step - accuracy: 0.8965 - loss: 0.2272146/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8968 - loss: 0.2308  197/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8970 - loss: 0.2336247/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8967 - loss: 0.2367290/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8964 - loss: 0.2385338/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8961 - loss: 0.2400383/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8959 - loss: 0.2409435/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8957 - loss: 0.2416459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8957 - loss: 0.2418 - val_accuracy: 0.8903 - val_loss: 0.2727
Epoch 42/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 11s 26ms/step - accuracy: 1.0000 - loss: 0.0792 55/459 ━━━━━━━━━━━━━━━━━━━━ 0s 944us/step - accuracy: 0.8970 - loss: 0.2170109/459 ━━━━━━━━━━━━━━━━━━━━ 0s 941us/step - accuracy: 0.8965 - loss: 0.2270167/459 ━━━━━━━━━━━━━━━━━━━━ 0s 919us/step - accuracy: 0.8971 - loss: 0.2314219/459 ━━━━━━━━━━━━━━━━━━━━ 0s 935us/step - accuracy: 0.8971 - loss: 0.2345267/459 ━━━━━━━━━━━━━━━━━━━━ 0s 955us/step - accuracy: 0.8967 - loss: 0.2370314/459 ━━━━━━━━━━━━━━━━━━━━ 0s 972us/step - accuracy: 0.8964 - loss: 0.2387369/459 ━━━━━━━━━━━━━━━━━━━━ 0s 963us/step - accuracy: 0.8962 - loss: 0.2401422/459 ━━━━━━━━━━━━━━━━━━━━ 0s 961us/step - accuracy: 0.8960 - loss: 0.2410459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8959 - loss: 0.2412 - val_accuracy: 0.8912 - val_loss: 0.2724
Epoch 43/2000
  1/459 ━━━━━━━━━━━━━━━━━━━━ 10s 24ms/step - accuracy: 1.0000 - loss: 0.0787 50/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8975 - loss: 0.2144   97/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8964 - loss: 0.2246153/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8967 - loss: 0.2296205/459 ━━━━━━━━━━━━━━━━━━━━ 0s 996us/step - accuracy: 0.8968 - loss: 0.2326255/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8964 - loss: 0.2356  310/459 ━━━━━━━━━━━━━━━━━━━━ 0s 987us/step - accuracy: 0.8960 - loss: 0.2378360/459 ━━━━━━━━━━━━━━━━━━━━ 0s 991us/step - accuracy: 0.8958 - loss: 0.2392408/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1000us/step - accuracy: 0.8956 - loss: 0.2400451/459 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.8955 - loss: 0.2405   459/459 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.8956 - loss: 0.2405 - val_accuracy: 0.8906 - val_loss: 0.2718
Epoch 44/2000
[Training log condensed: epochs 44–64 of 2000, 459 batches each. Training accuracy climbed from 0.8959 to 0.9024 and loss fell from 0.2402 to 0.2338; val_accuracy moved from 0.8912 to 0.8942 with val_loss holding near 0.271.]
[Prediction pass: 144/144 batches, ~0.8 ms/step.]
Neural Network
Accuracy: 0.8991926685577133
Precision: 0.9027731373204143
Recall: 0.9404803341454925
F1: 0.9212410501193318
Confusion Matrix:
[[1419  291]
 [ 171 2702]]

Creating a feature-importance graph doesn't work the same way for the neural network: its learned weights are spread across the hidden layers rather than tied to individual input features, so Keras provides no built-in importance measure. A model-agnostic technique such as permutation importance would be needed to approximate one.
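One model-agnostic workaround is permutation importance: shuffle one feature at a time and measure how much held-out accuracy drops. A minimal sketch, assuming a fitted model exposed as a `predict_fn` that returns probabilities (the function name and signature here are illustrative, not from the report):

```python
import numpy as np

def permutation_importance(predict_fn, X, y, n_repeats=5, seed=42):
    """Shuffle each column of X in turn and record the average drop in
    accuracy relative to the unshuffled baseline (larger drop = more important)."""
    rng = np.random.default_rng(seed)
    baseline = np.mean((predict_fn(X) >= 0.5).astype(int).ravel() == y)
    drops = []
    for j in range(X.shape[1]):
        scores = []
        for _ in range(n_repeats):
            X_perm = X.copy()
            rng.shuffle(X_perm[:, j])  # break the j-th feature's relationship to y
            acc = np.mean((predict_fn(X_perm) >= 0.5).astype(int).ravel() == y)
            scores.append(baseline - acc)
        drops.append(float(np.mean(scores)))
    return np.array(drops)
```

For the Keras model above it could be called as `permutation_importance(lambda A: model.predict(A, verbose=0), X_test, y_test)`, and the resulting array plotted as a bar chart like the tree-based importances.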

STRETCH QUESTION|TASK 2

Join the dwellings_neighborhoods_ml.csv data to the dwelling_ml.csv on the parcel column to create a new dataset. Duplicate the code for the stretch question above and update it to use this data. Explain the differences and if this changes the model you recommend to the Client.

The additional features made a big difference in the performance of every model type. Below each model's scores I added a second set showing how much it improved over its previous version (the same model without the additional features); across the board, adding more features helped substantially. I would still recommend the Random Forest to the Client because, of the models I tested, it performed best on the evaluation metrics explained in Task 4.
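Since the before/after comparison below repeats the same four metrics for each model, the bookkeeping could be factored into a small helper. A sketch using the report's existing sklearn imports; `score_model` and `report_improvement` are illustrative names, not part of the report:

```python
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

def score_model(y_true, y_pred):
    """Bundle the four evaluation metrics used throughout this report."""
    return {
        "Accuracy": accuracy_score(y_true, y_pred),
        "Precision": precision_score(y_true, y_pred),
        "Recall": recall_score(y_true, y_pred),
        "F1": f1_score(y_true, y_pred),
    }

def report_improvement(name, new_scores, old_scores):
    """Print new scores plus the change versus the smaller-feature model."""
    print(name)
    for metric, value in new_scores.items():
        print(f"{metric}: {value:.4f} (increase: {value - old_scores[metric]:+.4f})")
```

Each model section below could then call `report_improvement` once instead of repeating eight `print` lines.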

Show the code
df_neighborhoods = pd.read_csv("https://raw.githubusercontent.com/byuidatascience/data4dwellings/master/data-raw/dwellings_neighborhoods_ml/dwellings_neighborhoods_ml.csv")
large_df = pd.merge(df, df_neighborhoods, on="parcel")

large_X = large_df.drop(columns = ['parcel', 'yrbuilt', 'before1980'])
large_y = large_df.before1980

X_large_train, X_large_test, y_large_train, y_large_test = train_test_split(large_X, large_y, test_size=0.2, random_state=42)
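One thing worth checking after a join like this is whether the inner merge silently dropped rows. A hedged sketch (the `checked_merge` helper is illustrative; pandas' `validate="one_to_one"` raises if the key is not unique on both sides):

```python
import pandas as pd

def checked_merge(left, right, key="parcel"):
    """Inner-merge two frames on `key`, asserting the join is one-to-one
    and reporting how many rows of each side were dropped."""
    merged = pd.merge(left, right, on=key, validate="one_to_one")
    print(f"left rows dropped: {len(left) - len(merged)}, "
          f"right rows dropped: {len(right) - len(merged)}")
    return merged
```

If either count comes back nonzero, the two dwellings files do not cover the same parcels and the smaller- and larger-feature models would be trained on slightly different samples.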
Show the code
# XGBoost
model_large = XGBClassifier(
    objective='binary:hinge',   # hinge loss, so predict() returns hard 0/1 labels
    eval_metric='error',
    use_label_encoder=False     # ignored by recent xgboost releases; kept for older ones
)

model_large.fit(X_large_train, y_large_train)
y_large_pred = model_large.predict(X_large_test)

accuracy_large = accuracy_score(y_large_test, y_large_pred)
precision_large = precision_score(y_large_test, y_large_pred)
recall_large = recall_score(y_large_test, y_large_pred)
f1_large = f1_score(y_large_test, y_large_pred)

print('Large Dataset XGBoost')
print('Accuracy:', accuracy_large)
print('Precision:', precision_large)
print('Recall:', recall_large)
print('F1:', f1_large)
print()
print("Difference of scores before and after adding data:")
print('Accuracy Increase:', accuracy_large - accuracy)
print('Precision Increase:', precision_large - precision)
print('Recall Increase:', recall_large - recall)
print('F1 Increase:', f1_large - f1)
Large Dataset XGBoost
Accuracy: 0.9672805292329698
Precision: 0.9676958261863923
Recall: 0.9797395079594791
F1: 0.9736804257155185

Difference of scores before and after adding data:
Accuracy Increase: 0.03950396366456488
Precision Increase: 0.031226045966612048
Recall Increase: 0.03055746827970185
F1 Increase: 0.030897366078526223
Show the code
# DECISION TREE
large_tree = DecisionTreeClassifier()
large_tree.fit(X_large_train, y_large_train)
large_tree_pred = large_tree.predict(X_large_test)

large_tree_accuracy = accuracy_score(y_large_test, large_tree_pred)
large_tree_precision = precision_score(y_large_test, large_tree_pred)
large_tree_recall = recall_score(y_large_test, large_tree_pred)
large_tree_f1 = f1_score(y_large_test, large_tree_pred)

print('Large Dataset Decision Tree')
print('Accuracy:', large_tree_accuracy)
print('Precision:', large_tree_precision)
print('Recall:', large_tree_recall)
print('F1:', large_tree_f1)
print()
print("Difference of scores before and after adding data:")
print('Accuracy Increase:', large_tree_accuracy - tree_accuracy)
print('Precision Increase:', large_tree_precision - tree_precision)
print('Recall Increase:', large_tree_recall - tree_recall)
print('F1 Increase:', large_tree_f1 - tree_f1)
Large Dataset Decision Tree
Accuracy: 0.9594135526551046
Precision: 0.965666474321985
Recall: 0.9687409551374819
F1: 0.9672012714925589

Difference of scores before and after adding data:
Accuracy Increase: 0.05869350028765974
Precision Increase: 0.03663879511687507
Recall Increase: 0.05749835158718608
F1 Increase: 0.04715207100055396
Show the code
# RANDOM FOREST
large_forest = RandomForestClassifier()
large_forest.fit(X_large_train, y_large_train)
large_forest_pred = large_forest.predict(X_large_test)

large_forest_accuracy = accuracy_score(y_large_test, large_forest_pred)
large_forest_precision = precision_score(y_large_test, large_forest_pred)
large_forest_recall = recall_score(y_large_test, large_forest_pred)
large_forest_f1 = f1_score(y_large_test, large_forest_pred)

print('Large Dataset Random Forest')
print('Accuracy:', large_forest_accuracy)
print('Precision:', large_forest_precision)
print('Recall:', large_forest_recall)
print('F1:', large_forest_f1)
print()
print("Difference of scores before and after adding data:")
print('Accuracy Increase:', large_forest_accuracy - forest_accuracy)
print('Precision Increase:', large_forest_precision - forest_precision)
print('Recall Increase:', large_forest_recall - forest_recall)
print('F1 Increase:', large_forest_f1 - forest_f1)
Large Dataset Random Forest
Accuracy: 0.9690684784552118
Precision: 0.9753765932792584
Recall: 0.9745296671490593
F1: 0.9749529462863762

Difference of scores before and after adding data:
Accuracy Increase: 0.03758255220603002
Precision Increase: 0.03248978123356783
Recall Increase: 0.026391832133396242
F1 Increase: 0.02944791331171459
Show the code
# NEURAL NETWORK
from tensorflow import keras  # keras and Input are used below but missing from the import block
from tensorflow.keras.layers import Input

large_norm = MinMaxScaler().fit(X_large_train)
X_large_train = large_norm.transform(X_large_train)
X_large_test = large_norm.transform(X_large_test)

model = Sequential()
model.add(Input(shape=(len(X_large_train[0]),)))
model.add(Dense(16, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

opt = keras.optimizers.Adam()
model.compile(loss='binary_crossentropy', optimizer=opt, metrics=['accuracy'])

early_stop = keras.callbacks.EarlyStopping(monitor='val_loss', patience=10)

history = model.fit(X_large_train, y_large_train, epochs=2000, validation_split=.2, batch_size=32, callbacks=[early_stop],shuffle=False)

hist = pd.DataFrame(history.history)

hist = hist.reset_index()

predictions = model.predict(X_large_test)
binary_predictions = (predictions >= 0.5).astype(int)

large_nn_accuracy = accuracy_score(y_large_test, binary_predictions)
large_nn_precision = precision_score(y_large_test, binary_predictions)
large_nn_recall = recall_score(y_large_test, binary_predictions)
large_nn_f1 = f1_score(y_large_test, binary_predictions)

print('Large Dataset Neural Network')
print('Accuracy:', large_nn_accuracy)
print('Precision:', large_nn_precision)
print('Recall:', large_nn_recall)
print('F1:', large_nn_f1)
print()
print("Difference of scores before and after adding data:")
print('Accuracy Increase:', large_nn_accuracy - nn_accuracy)
print('Precision Increase:', large_nn_precision - nn_precision)
print('Recall Increase:', large_nn_recall - nn_recall)
print('F1 Increase:', large_nn_f1 - nn_f1)
[Training log condensed: epochs 1–11 of 2000, 560 batches each. Training accuracy climbed from 0.8222 to 0.9535 and loss fell from 0.4154 to 0.1224; val_accuracy improved from 0.9381 to 0.9490 with val_loss falling from 0.1644 to 0.1380.]
Epoch 12/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 18s 33ms/step - accuracy: 1.0000 - loss: 0.0809 42/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9494 - loss: 0.1327   83/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9500 - loss: 0.1300131/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9517 - loss: 0.1268167/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9522 - loss: 0.1254221/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9529 - loss: 0.1239271/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9536 - loss: 0.1226324/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9540 - loss: 0.1217370/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9542 - loss: 0.1213411/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9542 - loss: 0.1211455/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9542 - loss: 0.1212498/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9542 - loss: 0.1212537/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9542 - loss: 0.1213560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.9541 - loss: 0.1213 - val_accuracy: 0.9495 - val_loss: 0.1373
Epoch 13/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 15s 27ms/step - accuracy: 1.0000 - loss: 0.0791 50/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9507 - loss: 0.1298   96/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9517 - loss: 0.1275140/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9529 - loss: 0.1249180/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9533 - loss: 0.1237214/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9537 - loss: 0.1227255/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9542 - loss: 0.1217294/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9545 - loss: 0.1208330/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9547 - loss: 0.1203367/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9548 - loss: 0.1200404/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9548 - loss: 0.1198448/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9548 - loss: 0.1198485/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9547 - loss: 0.1199526/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9547 - loss: 0.1199560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 2ms/step - accuracy: 0.9546 - loss: 0.1200 - val_accuracy: 0.9502 - val_loss: 0.1364
Epoch 14/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 16s 30ms/step - accuracy: 1.0000 - loss: 0.0763 45/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9514 - loss: 0.1286   83/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9524 - loss: 0.1268122/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9537 - loss: 0.1242168/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9543 - loss: 0.1225216/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9547 - loss: 0.1211265/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9552 - loss: 0.1199311/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9554 - loss: 0.1191353/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9555 - loss: 0.1186393/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9555 - loss: 0.1184436/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9555 - loss: 0.1184484/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9554 - loss: 0.1185534/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9553 - loss: 0.1185560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.9552 - loss: 0.1186 - val_accuracy: 0.9508 - val_loss: 0.1359
Epoch 15/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 14s 25ms/step - accuracy: 1.0000 - loss: 0.0724 48/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9519 - loss: 0.1271   94/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9530 - loss: 0.1246139/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9542 - loss: 0.1219183/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9545 - loss: 0.1206225/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9550 - loss: 0.1194261/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9553 - loss: 0.1185300/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9556 - loss: 0.1177342/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9557 - loss: 0.1172376/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9558 - loss: 0.1169421/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9558 - loss: 0.1168472/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9557 - loss: 0.1169515/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9556 - loss: 0.1170551/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9555 - loss: 0.1170560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 2ms/step - accuracy: 0.9555 - loss: 0.1170 - val_accuracy: 0.9508 - val_loss: 0.1349
Epoch 16/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 16s 29ms/step - accuracy: 1.0000 - loss: 0.0690 48/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9510 - loss: 0.1253  103/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1000us/step - accuracy: 0.9530 - loss: 0.1221146/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9540 - loss: 0.1198   190/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9545 - loss: 0.1186239/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9552 - loss: 0.1173291/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9556 - loss: 0.1161342/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9559 - loss: 0.1154387/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9560 - loss: 0.1151435/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9560 - loss: 0.1151479/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9559 - loss: 0.1152521/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9558 - loss: 0.1152560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.9558 - loss: 0.1153 - val_accuracy: 0.9511 - val_loss: 0.1341
Epoch 17/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 16s 29ms/step - accuracy: 1.0000 - loss: 0.0664 46/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9508 - loss: 0.1237   97/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9525 - loss: 0.1208150/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9540 - loss: 0.1180202/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9546 - loss: 0.1166258/560 ━━━━━━━━━━━━━━━━━━━━ 0s 979us/step - accuracy: 0.9552 - loss: 0.1151313/560 ━━━━━━━━━━━━━━━━━━━━ 0s 970us/step - accuracy: 0.9557 - loss: 0.1141364/560 ━━━━━━━━━━━━━━━━━━━━ 0s 972us/step - accuracy: 0.9559 - loss: 0.1135410/560 ━━━━━━━━━━━━━━━━━━━━ 0s 986us/step - accuracy: 0.9559 - loss: 0.1133454/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9559 - loss: 0.1134  495/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9558 - loss: 0.1135545/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9558 - loss: 0.1135560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.9558 - loss: 0.1135 - val_accuracy: 0.9531 - val_loss: 0.1325
Epoch 18/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 16s 30ms/step - accuracy: 1.0000 - loss: 0.0642 35/560 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.9528 - loss: 0.1233   71/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9535 - loss: 0.1200108/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9548 - loss: 0.1180150/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9557 - loss: 0.1159193/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9560 - loss: 0.1148226/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9563 - loss: 0.1139265/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9566 - loss: 0.1129306/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9568 - loss: 0.1122348/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9569 - loss: 0.1117394/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9570 - loss: 0.1114440/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9569 - loss: 0.1114485/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9568 - loss: 0.1115531/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9567 - loss: 0.1115560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 2ms/step - accuracy: 0.9567 - loss: 0.1115 - val_accuracy: 0.9535 - val_loss: 0.1313
Epoch 19/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 12s 23ms/step - accuracy: 1.0000 - loss: 0.0604 57/560 ━━━━━━━━━━━━━━━━━━━━ 0s 893us/step - accuracy: 0.9546 - loss: 0.1186111/560 ━━━━━━━━━━━━━━━━━━━━ 0s 913us/step - accuracy: 0.9559 - loss: 0.1157160/560 ━━━━━━━━━━━━━━━━━━━━ 0s 950us/step - accuracy: 0.9566 - loss: 0.1136210/560 ━━━━━━━━━━━━━━━━━━━━ 0s 966us/step - accuracy: 0.9569 - loss: 0.1124259/560 ━━━━━━━━━━━━━━━━━━━━ 0s 977us/step - accuracy: 0.9572 - loss: 0.1112305/560 ━━━━━━━━━━━━━━━━━━━━ 0s 995us/step - accuracy: 0.9574 - loss: 0.1103355/560 ━━━━━━━━━━━━━━━━━━━━ 0s 996us/step - accuracy: 0.9575 - loss: 0.1097407/560 ━━━━━━━━━━━━━━━━━━━━ 0s 991us/step - accuracy: 0.9575 - loss: 0.1094460/560 ━━━━━━━━━━━━━━━━━━━━ 0s 988us/step - accuracy: 0.9573 - loss: 0.1095516/560 ━━━━━━━━━━━━━━━━━━━━ 0s 978us/step - accuracy: 0.9572 - loss: 0.1096560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.9571 - loss: 0.1096 - val_accuracy: 0.9531 - val_loss: 0.1304
Epoch 20/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 24s 43ms/step - accuracy: 1.0000 - loss: 0.0568 48/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9538 - loss: 0.1171   89/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9552 - loss: 0.1151142/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9566 - loss: 0.1123188/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9569 - loss: 0.1112238/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9573 - loss: 0.1099288/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9575 - loss: 0.1087335/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9577 - loss: 0.1081380/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9577 - loss: 0.1077427/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9577 - loss: 0.1077475/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9575 - loss: 0.1078521/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9575 - loss: 0.1078560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.9574 - loss: 0.1079 - val_accuracy: 0.9524 - val_loss: 0.1295
Epoch 21/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 17s 32ms/step - accuracy: 1.0000 - loss: 0.0551 39/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9528 - loss: 0.1161   78/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9549 - loss: 0.1136120/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9566 - loss: 0.1113167/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9575 - loss: 0.1099222/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9581 - loss: 0.1085276/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9585 - loss: 0.1072328/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9588 - loss: 0.1064378/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9588 - loss: 0.1060426/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9588 - loss: 0.1059482/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9586 - loss: 0.1060535/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9586 - loss: 0.1060560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.9585 - loss: 0.1060 - val_accuracy: 0.9522 - val_loss: 0.1287
Epoch 22/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 13s 25ms/step - accuracy: 1.0000 - loss: 0.0521 47/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9525 - loss: 0.1130  101/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9552 - loss: 0.1105155/560 ━━━━━━━━━━━━━━━━━━━━ 0s 998us/step - accuracy: 0.9569 - loss: 0.1084210/560 ━━━━━━━━━━━━━━━━━━━━ 0s 975us/step - accuracy: 0.9576 - loss: 0.1072265/560 ━━━━━━━━━━━━━━━━━━━━ 0s 961us/step - accuracy: 0.9583 - loss: 0.1058321/560 ━━━━━━━━━━━━━━━━━━━━ 0s 949us/step - accuracy: 0.9588 - loss: 0.1048373/560 ━━━━━━━━━━━━━━━━━━━━ 0s 952us/step - accuracy: 0.9590 - loss: 0.1043423/560 ━━━━━━━━━━━━━━━━━━━━ 0s 959us/step - accuracy: 0.9591 - loss: 0.1041477/560 ━━━━━━━━━━━━━━━━━━━━ 0s 955us/step - accuracy: 0.9591 - loss: 0.1042532/560 ━━━━━━━━━━━━━━━━━━━━ 0s 951us/step - accuracy: 0.9591 - loss: 0.1042560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.9591 - loss: 0.1043 - val_accuracy: 0.9535 - val_loss: 0.1283
Epoch 23/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 12s 23ms/step - accuracy: 1.0000 - loss: 0.0491 48/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9549 - loss: 0.1109   96/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9569 - loss: 0.1088149/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9585 - loss: 0.1067202/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9591 - loss: 0.1056253/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9597 - loss: 0.1043304/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9602 - loss: 0.1033352/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9604 - loss: 0.1027407/560 ━━━━━━━━━━━━━━━━━━━━ 0s 997us/step - accuracy: 0.9606 - loss: 0.1024463/560 ━━━━━━━━━━━━━━━━━━━━ 0s 987us/step - accuracy: 0.9605 - loss: 0.1024510/560 ━━━━━━━━━━━━━━━━━━━━ 0s 995us/step - accuracy: 0.9605 - loss: 0.1025553/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9604 - loss: 0.1025  560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.9604 - loss: 0.1025 - val_accuracy: 0.9535 - val_loss: 0.1276
Epoch 24/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 16s 30ms/step - accuracy: 1.0000 - loss: 0.0487 48/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9558 - loss: 0.1086  106/560 ━━━━━━━━━━━━━━━━━━━━ 0s 976us/step - accuracy: 0.9581 - loss: 0.1065160/560 ━━━━━━━━━━━━━━━━━━━━ 0s 967us/step - accuracy: 0.9593 - loss: 0.1048216/560 ━━━━━━━━━━━━━━━━━━━━ 0s 954us/step - accuracy: 0.9600 - loss: 0.1036265/560 ━━━━━━━━━━━━━━━━━━━━ 0s 966us/step - accuracy: 0.9606 - loss: 0.1025322/560 ━━━━━━━━━━━━━━━━━━━━ 0s 952us/step - accuracy: 0.9611 - loss: 0.1015372/560 ━━━━━━━━━━━━━━━━━━━━ 0s 960us/step - accuracy: 0.9612 - loss: 0.1010424/560 ━━━━━━━━━━━━━━━━━━━━ 0s 961us/step - accuracy: 0.9613 - loss: 0.1008463/560 ━━━━━━━━━━━━━━━━━━━━ 0s 990us/step - accuracy: 0.9612 - loss: 0.1009508/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9612 - loss: 0.1009  560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.9611 - loss: 0.1010 - val_accuracy: 0.9531 - val_loss: 0.1272
Epoch 25/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 15s 27ms/step - accuracy: 1.0000 - loss: 0.0461 55/560 ━━━━━━━━━━━━━━━━━━━━ 0s 944us/step - accuracy: 0.9573 - loss: 0.1068112/560 ━━━━━━━━━━━━━━━━━━━━ 0s 917us/step - accuracy: 0.9594 - loss: 0.1048165/560 ━━━━━━━━━━━━━━━━━━━━ 0s 927us/step - accuracy: 0.9603 - loss: 0.1033208/560 ━━━━━━━━━━━━━━━━━━━━ 0s 981us/step - accuracy: 0.9607 - loss: 0.1024253/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9613 - loss: 0.1013  302/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9617 - loss: 0.1003348/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9619 - loss: 0.0997401/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9621 - loss: 0.0994449/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9620 - loss: 0.0994500/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9620 - loss: 0.0994552/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9619 - loss: 0.0995560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.9619 - loss: 0.0995 - val_accuracy: 0.9531 - val_loss: 0.1272
Epoch 26/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 13s 25ms/step - accuracy: 1.0000 - loss: 0.0441 49/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9571 - loss: 0.1054   98/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9592 - loss: 0.1038146/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9604 - loss: 0.1022194/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9608 - loss: 0.1013248/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9614 - loss: 0.1000305/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9619 - loss: 0.0988355/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9621 - loss: 0.0982410/560 ━━━━━━━━━━━━━━━━━━━━ 0s 992us/step - accuracy: 0.9622 - loss: 0.0979457/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9622 - loss: 0.0980  504/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9621 - loss: 0.0980554/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9621 - loss: 0.0981560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.9621 - loss: 0.0981 - val_accuracy: 0.9542 - val_loss: 0.1274
Epoch 27/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 15s 27ms/step - accuracy: 1.0000 - loss: 0.0429 55/560 ━━━━━━━━━━━━━━━━━━━━ 0s 953us/step - accuracy: 0.9564 - loss: 0.1035 99/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9585 - loss: 0.1022  150/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9599 - loss: 0.1007198/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9605 - loss: 0.0998249/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9611 - loss: 0.0986297/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9616 - loss: 0.0976349/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9620 - loss: 0.0970401/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9622 - loss: 0.0966453/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9621 - loss: 0.0966502/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9621 - loss: 0.0967554/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9621 - loss: 0.0968560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.9621 - loss: 0.0968 - val_accuracy: 0.9549 - val_loss: 0.1277
Epoch 28/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 13s 25ms/step - accuracy: 1.0000 - loss: 0.0414 47/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9560 - loss: 0.1022   95/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9583 - loss: 0.1011145/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9599 - loss: 0.0995196/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9606 - loss: 0.0986245/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9612 - loss: 0.0975293/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9617 - loss: 0.0964342/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9621 - loss: 0.0958392/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9623 - loss: 0.0954442/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9623 - loss: 0.0954489/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9623 - loss: 0.0955536/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9623 - loss: 0.0955560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.9623 - loss: 0.0956 - val_accuracy: 0.9546 - val_loss: 0.1274
Epoch 29/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 15s 27ms/step - accuracy: 1.0000 - loss: 0.0406 48/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9575 - loss: 0.1003  100/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9595 - loss: 0.0992146/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9606 - loss: 0.0979197/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9612 - loss: 0.0971242/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9617 - loss: 0.0961292/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9622 - loss: 0.0951341/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9625 - loss: 0.0945388/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9627 - loss: 0.0941437/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9627 - loss: 0.0941486/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9627 - loss: 0.0942528/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9627 - loss: 0.0942560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.9627 - loss: 0.0943 - val_accuracy: 0.9551 - val_loss: 0.1274
Epoch 30/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 18s 32ms/step - accuracy: 1.0000 - loss: 0.0393 53/560 ━━━━━━━━━━━━━━━━━━━━ 0s 979us/step - accuracy: 0.9577 - loss: 0.0988103/560 ━━━━━━━━━━━━━━━━━━━━ 0s 995us/step - accuracy: 0.9599 - loss: 0.0977154/560 ━━━━━━━━━━━━━━━━━━━━ 0s 992us/step - accuracy: 0.9613 - loss: 0.0964209/560 ━━━━━━━━━━━━━━━━━━━━ 0s 974us/step - accuracy: 0.9618 - loss: 0.0956265/560 ━━━━━━━━━━━━━━━━━━━━ 0s 960us/step - accuracy: 0.9625 - loss: 0.0943314/560 ━━━━━━━━━━━━━━━━━━━━ 0s 969us/step - accuracy: 0.9629 - loss: 0.0935359/560 ━━━━━━━━━━━━━━━━━━━━ 0s 989us/step - accuracy: 0.9631 - loss: 0.0930410/560 ━━━━━━━━━━━━━━━━━━━━ 0s 988us/step - accuracy: 0.9632 - loss: 0.0928462/560 ━━━━━━━━━━━━━━━━━━━━ 0s 985us/step - accuracy: 0.9631 - loss: 0.0929513/560 ━━━━━━━━━━━━━━━━━━━━ 0s 987us/step - accuracy: 0.9631 - loss: 0.0929557/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9631 - loss: 0.0930  560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.9631 - loss: 0.0930 - val_accuracy: 0.9553 - val_loss: 0.1278
Epoch 31/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 15s 28ms/step - accuracy: 1.0000 - loss: 0.0378 44/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9584 - loss: 0.0977   95/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9601 - loss: 0.0970147/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9615 - loss: 0.0955198/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9620 - loss: 0.0948246/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9625 - loss: 0.0937302/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9630 - loss: 0.0926351/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9633 - loss: 0.0921403/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9634 - loss: 0.0917455/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9634 - loss: 0.0918513/560 ━━━━━━━━━━━━━━━━━━━━ 0s 985us/step - accuracy: 0.9634 - loss: 0.0919559/560 ━━━━━━━━━━━━━━━━━━━━ 0s 994us/step - accuracy: 0.9634 - loss: 0.0919560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.9634 - loss: 0.0919 - val_accuracy: 0.9553 - val_loss: 0.1284
Epoch 32/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 14s 26ms/step - accuracy: 1.0000 - loss: 0.0360 49/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9592 - loss: 0.0967   97/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9604 - loss: 0.0960147/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9616 - loss: 0.0946202/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9621 - loss: 0.0937258/560 ━━━━━━━━━━━━━━━━━━━━ 0s 982us/step - accuracy: 0.9627 - loss: 0.0925303/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9631 - loss: 0.0916  343/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9633 - loss: 0.0911383/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9635 - loss: 0.0908430/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9635 - loss: 0.0907482/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9635 - loss: 0.0908537/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9634 - loss: 0.0908560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.9634 - loss: 0.0909 - val_accuracy: 0.9553 - val_loss: 0.1287
Epoch 33/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 16s 29ms/step - accuracy: 1.0000 - loss: 0.0353 47/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9604 - loss: 0.0951   97/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9612 - loss: 0.0947149/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9622 - loss: 0.0933202/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9625 - loss: 0.0925258/560 ━━━━━━━━━━━━━━━━━━━━ 0s 989us/step - accuracy: 0.9631 - loss: 0.0913309/560 ━━━━━━━━━━━━━━━━━━━━ 0s 990us/step - accuracy: 0.9635 - loss: 0.0904359/560 ━━━━━━━━━━━━━━━━━━━━ 0s 994us/step - accuracy: 0.9637 - loss: 0.0898412/560 ━━━━━━━━━━━━━━━━━━━━ 0s 989us/step - accuracy: 0.9638 - loss: 0.0895467/560 ━━━━━━━━━━━━━━━━━━━━ 0s 980us/step - accuracy: 0.9638 - loss: 0.0896520/560 ━━━━━━━━━━━━━━━━━━━━ 0s 978us/step - accuracy: 0.9637 - loss: 0.0897560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.9637 - loss: 0.0898 - val_accuracy: 0.9551 - val_loss: 0.1291
Epoch 34/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 14s 25ms/step - accuracy: 1.0000 - loss: 0.0355 43/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9602 - loss: 0.0943   87/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9611 - loss: 0.0944132/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9623 - loss: 0.0928181/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9626 - loss: 0.0920230/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9631 - loss: 0.0909281/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9635 - loss: 0.0899329/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9639 - loss: 0.0892375/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9641 - loss: 0.0887430/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9642 - loss: 0.0886484/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9642 - loss: 0.0887537/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9642 - loss: 0.0887560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.9642 - loss: 0.0888 - val_accuracy: 0.9560 - val_loss: 0.1298
Epoch 35/2000
  1/560 ━━━━━━━━━━━━━━━━━━━━ 13s 24ms/step - accuracy: 1.0000 - loss: 0.0340 47/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9614 - loss: 0.0930   98/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9622 - loss: 0.0928143/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9630 - loss: 0.0917183/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9632 - loss: 0.0912227/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9636 - loss: 0.0902268/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9640 - loss: 0.0893319/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9644 - loss: 0.0885367/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9647 - loss: 0.0880417/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9648 - loss: 0.0877471/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9648 - loss: 0.0878523/560 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step - accuracy: 0.9648 - loss: 0.0879560/560 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step - accuracy: 0.9648 - loss: 0.0879 - val_accuracy: 0.9555 - val_loss: 0.1298
  1/175 ━━━━━━━━━━━━━━━━━━━━ 6s 40ms/step 79/175 ━━━━━━━━━━━━━━━━━━━━ 0s 644us/step165/175 ━━━━━━━━━━━━━━━━━━━━ 0s 619us/step175/175 ━━━━━━━━━━━━━━━━━━━━ 0s 905us/step175/175 ━━━━━━━━━━━━━━━━━━━━ 0s 985us/step
Large Dataset Neural Network
Accuracy: 0.9533345252994815
Precision: 0.9536931818181819
Recall: 0.9716353111432706
F1: 0.9625806451612903

Difference of scores before and after adding data:
Accuracy Increase: 0.05414185674176819
Precision Increase: 0.05092004449776755
Recall Increase: 0.03115497699777814
F1 Increase: 0.0413395950419585
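The scores above come from comparing the network's thresholded predictions against the true labels. As a minimal sketch of how each score is computed, here is a toy example with made-up labels and probabilities (the names `y_true`, `probs`, and the 0.5 threshold are illustrative, not taken from the report's actual test set):

```python
import numpy as np
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, f1_score)

# Toy stand-ins: true labels and predicted probabilities,
# thresholded at 0.5 to get hard class predictions.
y_true = np.array([1, 0, 1, 1, 0, 1, 0, 1])
probs  = np.array([0.9, 0.2, 0.8, 0.4, 0.1, 0.7, 0.6, 0.95])
y_pred = (probs >= 0.5).astype(int)

print("Accuracy: ", accuracy_score(y_true, y_pred))   # 0.75
print("Precision:", precision_score(y_true, y_pred))  # 0.8
print("Recall:   ", recall_score(y_true, y_pred))     # 0.8
print("F1:       ", f1_score(y_true, y_pred))         # 0.8
```

Precision penalizes the one false positive here, recall penalizes the one false negative, and F1 is their harmonic mean, which is why all three land at 0.8 on this toy data.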

STRETCH QUESTION|TASK 3

Can you build a model that predicts the year a house was built? Explain the model and the evaluation metrics you would use to determine if the model is good.

I decided to use the XGBRegressor for this problem because it is fairly easy to build, and XGBoost performed about as well as the Random Forest on the classification problem, so I figured it would do fairly well on the regression problem too. I did some hyperparameter tuning with a grid search (I commented out the GridSearchCV code because it takes a long time to run and I only needed to run it once). The final model got an RMSE of about 12.6, which means its predictions are typically within about 12.6 years of the actual value (so if it guessed a certain home was built in 1983, the actual year is usually within +-12.6 years of that). It got an MAE of about 6.9, which means that on average the model's prediction was 6.9 years off of the actual value. Lastly, the R^2 value is a measure of how well the model explains the variance in the yrbuilt values: about 88.5% of the variation in build year is explained by the model, while the remaining 11.5% is due to unexplained factors (a higher R^2 score is better, with 1 being the best possible).

Show the code
from sklearn.model_selection import GridSearchCV
from sklearn.metrics import mean_squared_error, mean_absolute_error, r2_score

X_reg = large_df.drop(columns = ['parcel', 'yrbuilt', 'before1980'])
y_reg = large_df.yrbuilt

X_reg_train, X_reg_test, y_reg_train, y_reg_test = train_test_split(X_reg, y_reg, test_size=0.2, random_state=42)

# param_grid = {
#     'n_estimators': [100, 300, 500],
#     'learning_rate': [0.01, 0.1, 0.2],
#     'max_depth': [3, 5, 7],
#     'subsample': [0.7, 0.8, 1.0],
#     'colsample_bytree': [0.7, 0.8, 1.0]
# }
# search = GridSearchCV(XGBRegressor(), param_grid, cv=5, scoring='neg_mean_squared_error', n_jobs=-1)
# search.fit(X_reg_train, y_reg_train)
# best_model = search.best_estimator_
# print("Best parameters:", search.best_params_)

regr = XGBRegressor(colsample_bytree=0.7, learning_rate=0.2, max_depth=7,
                    n_estimators=500, subsample=0.8)
regr.fit(X_reg_train, y_reg_train)
reg_pred = regr.predict(X_reg_test)

rmse = np.sqrt(mean_squared_error(y_reg_test, reg_pred))
mae = mean_absolute_error(y_reg_test, reg_pred)
r2 = r2_score(y_reg_test, reg_pred)

print(f"RMSE: {rmse}")
print(f"MAE: {mae}")
print(f"R^2: {r2}")
RMSE: 12.637653317450606
MAE: 6.895981788635254
R^2: 0.8850180506706238
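As a sanity check on the R^2 interpretation above, R^2 can also be computed by hand as 1 - SS_res/SS_tot (residual sum of squares over total sum of squares). This toy sketch uses made-up build years, not the housing data, just to show that "variance explained" reading:

```python
import numpy as np
from sklearn.metrics import r2_score

# Toy values (not from the housing data) to show what R^2 measures.
y_true = np.array([1950.0, 1960.0, 1970.0, 1980.0, 1990.0])
y_pred = np.array([1952.0, 1958.0, 1973.0, 1978.0, 1991.0])

ss_res = np.sum((y_true - y_pred) ** 2)         # unexplained variation
ss_tot = np.sum((y_true - y_true.mean()) ** 2)  # total variation
manual_r2 = 1 - ss_res / ss_tot

print(manual_r2)                 # 0.978
print(r2_score(y_true, y_pred))  # same value from sklearn
```

Here the residuals account for only 2.2% of the total variation, so R^2 is 0.978; the report's 0.885 means the XGBRegressor leaves about 11.5% of the variation in yrbuilt unexplained.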